NEURAL NETWORKS AND APPROXIMATION BY SUPERPOSITION OF GAUSSIANS
Author: Paulo Jorge
Abstract
The aim of this paper is to discuss a nonlinear approximation problem relevant to the approximation of data by radial-basis-function neural networks. The approximation is based on superpositions of translated Gaussians. The method used enables us to give explicit approximations and error bounds. New connections between this problem and sampling theory are exposed, but the method used departs radically from those commonly used to obtain sampling results, since (i) it applies to signals that are not band-limited, and possibly even discontinuous; (ii) the sampling knots (the centers of the radial-basis functions) need not be equidistant; (iii) the basic approximation building block is the Gaussian, not the usual sinc kernel. The results given offer an answer to the following problem: how complex should a neural network be in order to approximate a given signal to better than a certain prescribed accuracy? The results show that O(1/N) accuracy is possible with a network of N basis functions.
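To make the kind of approximation concrete, the following is a minimal numerical sketch in Python/NumPy, not the paper's explicit construction: it fits a discontinuous signal by a least-squares superposition of translated Gaussians centred at non-equidistant knots and prints the maximum error as the number N of basis functions grows. The target signal, the random knot placement, and the width 1/N are illustrative assumptions, not values taken from the paper.

# Sketch only: least-squares fit by a superposition of translated Gaussians.
# The signal, knots, and width below are illustrative assumptions.
import numpy as np

def fit_gaussian_superposition(x, y, centers, width):
    """Least-squares weights w_k for y(x) ~ sum_k w_k exp(-((x - c_k)/width)^2)."""
    # Design matrix: one translated Gaussian per column.
    G = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
    weights, *_ = np.linalg.lstsq(G, y, rcond=None)
    return weights

def eval_gaussian_superposition(x, centers, width, weights):
    G = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
    return G @ weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 400)
    # A signal that is not band-limited (it has a jump at x = 0.5).
    y = np.where(x < 0.5, np.sin(2 * np.pi * x), 0.3)

    for N in (10, 20, 40, 80):
        # Non-equidistant knots: sorted random centers in [0, 1].
        centers = np.sort(rng.uniform(0.0, 1.0, N))
        width = 1.0 / N  # width shrinking with N (an assumption of this sketch)
        w = fit_gaussian_superposition(x, y, centers, width)
        err = np.max(np.abs(y - eval_gaussian_superposition(x, centers, width, w)))
        print(f"N = {N:3d}  max error = {err:.3e}")

The paper itself gives explicit approximations and error bounds rather than a least-squares fit; the sketch only illustrates the setting (Gaussian building blocks, non-equidistant knots, non-band-limited signal) in which its O(1/N) result is stated.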
Similar resources
STRUCTURAL DAMAGE DETECTION BY MODEL UPDATING METHOD BASED ON CASCADE FEED-FORWARD NEURAL NETWORK AS AN EFFICIENT APPROXIMATION MECHANISM
Vibration-based techniques of structural damage detection using the model updating method are computationally expensive for large-scale structures. In this study, after precisely locating the eventual damage of a structure using a modal strain energy based index (MSEBI), to efficiently reduce the computational cost of model updating during the optimization process of damage severity detection, the M...
Comparison of the performances of neural networks specification, the Translog and the Fourier flexible forms when different production technologies are used
This paper investigates the performances of artificial neural networks approximation, the Translog and the Fourier flexible functional forms for the cost function, when different production technologies are used. Using simulated data bases, the author provides a comparison in terms of capability to reproduce input demands and in terms of the corresponding input elasticities of substitution esti...
Kolmogorov's spline network
In this paper, an innovative neural-network architecture is proposed and elucidated. This architecture, based on Kolmogorov's superposition theorem (1957) and called the Kolmogorov's spline network (KSN), utilizes more degrees of adaptation to data than currently used neural-network architectures (NNAs). By using the cubic spline technique of approximation, both for activation and internal func...
Harmonic Analysis of Neural Networks
It is known that superpositions of ridge functions (single hidden-layer feedforward neural networks) may give good approximations to certain kinds of multivariate functions. It remains unclear, however, how to effectively obtain such approximations. In this paper, we use ideas from harmonic analysis to attack this question. We introduce a special admissibility condition for neural activation fu...